API Reference

Commands and responses

Intent

Runtime version: v3.x or later

intent() is a predefined function used to define a voice or text command. In the function, you specify the expected user inputs (patterns), the conditions on when the command must be available to the user (visual filters), and the actions that must occur when the user’s input matches one of the patterns. For details, see Intent.

Syntax

Function syntax

intent([filter,] [noctx,] pattern1 [, pattern2, …, patternN], action)

Parameters

  • filter (function): Defines the conditions on when the intent can be invoked.

  • noctx (function): Signals that the intent must not switch the current context. For details, see noContext intents.

  • pattern (string): Comma-separated strings, each representing a pattern that invokes the intent.

  • action (function): Defines what actions must be taken when one of the intent patterns is matched. Either an anonymous arrow function or the reply() function.
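
Example

A minimal sketch of an intent with two patterns; the greeting texts are hypothetical and for illustration only:

Dialog script
intent('Hello', 'Good morning', p => {
    // Played when the user's input matches one of the patterns above
    p.play('Hi! How can I help you?');
});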

Play

Runtime version: v3.x or later

play() is a predefined function used to provide voice or text responses or to send JSON commands to the web or mobile client app. If more than one response is passed to the play() function, one of them is chosen at random and played. For details, see play().

Syntax

Function syntax

play([voice(lang),] response1 [, response2, …, responseN] [, opts(options)])

Or

Function syntax

play(command [, opts(options)])

Parameters

  • voice (function): Settings defining how the voice of the assistant must be customized. voice() takes the following settings:

      • language: language, or accent, in which the response must be played - en, fr, de, it, ru, es

      • gender: voice gender - male or female

      • type: voice type - 0 (standard for the selected gender) or 1 (custom voice)

      • pitch: speaking pitch in the range [-20.0, 20.0]

      • rate: speaking rate in the range [0.25, 4]

    For details, see Voice settings.

  • response (string/number): Comma-separated strings or numbers, each representing a voice response from Alan AI.

  • command (object): An arbitrary JSON object used to send commands to the client app.

  • opts (function): Options defining how the play() function must be executed in the client app. opts() takes an object with one or more of the following options:

      • force:true: execute a command even if the assistant button is not active in the client app

      • activate:true: activate the assistant button in the client app before a response is played or a command is executed

      • deactivate:true: deactivate the assistant button in the client app after a response is played or a command is executed

    For details, see Specifying play options.
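
Example

A minimal sketch showing both forms of play(); the command name sent to the client app is hypothetical:

Dialog script
intent('What can you do?', p => {
    // One of the two responses is chosen at random and played
    p.play('I can answer questions about your order.',
           'Ask me anything about your order.');
    // Send an arbitrary JSON command to the client app
    p.play({command: 'showHelpScreen'});
});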

Reply

Runtime version: v3.x or later

reply() is a predefined action function that provides the specified voice or text response to the user. If more than one response is passed to the reply() function, one of them is chosen at random and played. For details, see reply().

Syntax

Function syntax

reply(response1 [, response2, …, responseN])

Parameters

  • response (string): Comma-separated strings, each representing a response pattern from Alan AI.
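
Example

A minimal sketch using reply() as the intent action; one of the two responses is chosen at random:

Dialog script
intent('What is your name?', reply('My name is Alan.', 'I go by Alan.'));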

Contexts

Then

Runtime version: v3.x or later

then() is a predefined function used to activate the context. If you need to share data between the current context and the activated one, you can pass this data with the state object. For details, see Activating contexts.

Syntax

Function syntax

then(context[, state])

Parameters

  • context (function): The variable with which the context is defined.

  • state (object): Data to share with the activated context; it is passed through the predefined state object that exists in every context.
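
Example

A minimal sketch, assuming a child context defined with the context() function and then() called on the p object inside the intent action; passing data through the state parameter is omitted here:

Dialog script
let checkout = context(() => {
    intent('Yes, confirm the order', p => {
        p.play('Your order is confirmed');
        p.resolve();
    });
});

intent('I want to place an order', p => {
    p.play('Should I confirm your order?');
    // Activate the checkout context
    p.then(checkout);
});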

Resolve

Runtime version: v3.x or later

resolve() is a predefined function used to manually deactivate the current context and return to its parent context. You can pass any data to it; this data will be available in the parent context. For details, see Exiting contexts.

Syntax

Function syntax

resolve([returnValue])

Parameters

  • returnValue (object): The object returned to the parent context.
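
Example

A minimal sketch, assuming a context defined with the context() function and resolve() called on the p object; resolve() deactivates the context and can pass data back to the parent:

Dialog script
let confirmDeletion = context(() => {
    intent('Yes, delete it', p => {
        p.play('Deleted');
        // Return to the parent context, optionally passing data back
        p.resolve({confirmed: true});
    });
});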

Fallback

Runtime version: v3.x or later

For each context, you can define a fallback response which will be activated if this context is active and no intent from this context has been matched. For details, see Error handling and re-prompts.

Syntax

Function syntax

fallback(pattern1 [, pattern2, …, patternN])

Parameters

  • pattern (string): Comma-separated strings, each representing a response pattern from Alan AI.
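
Example

A minimal sketch, assuming a fallback defined at the top (root) level of the dialog script; one of the responses is played when no intent in this context is matched:

Dialog script
fallback(
    'Sorry, I did not get that.',
    'Could you please rephrase?'
);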

noContext

Runtime version: v3.x or later

The noContext function wraps all intents that must not switch the current context, for example, general questions in the dialog. For details, see noContext intents.

Syntax

Function syntax

noContext(() => {intent1 [, intent2, …, intentN]});

Parameters

  • intent (function): An intent that must not switch the current context when invoked.
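
Example

A minimal sketch: the intents below can be matched at any point in the dialog without switching the current context.

Dialog script
noContext(() => {
    intent('What time is it?', p => {
        p.play(`It is ${new Date().toLocaleTimeString()}`);
    });

    intent('Help', reply('You can ask me about your orders at any time.'));
});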

State

Runtime version: v3.x or later

Each context has a special predefined object — state. You can access it via p.state. This object should be treated as the knowledge base that is available to Alan AI in the current conversational context. You can store any data in it. For details, see state.
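
Example

A minimal sketch: both intents below belong to the same (root) context, so they share its state object.

Dialog script
intent('My name is $(NAME* .+)', p => {
    // Store data in the knowledge base of the current context
    p.state.userName = p.NAME.value;
    p.play(`Nice to meet you, ${p.state.userName}`);
});

intent('What is my name?', p => {
    p.play(p.state.userName
        ? `Your name is ${p.state.userName}`
        : `I don't know your name yet`);
});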

onEnter

Runtime version: v3.x or later

onEnter() is a special predefined callback activated each time the script enters a context. For details, see Using onEnter() function.

Syntax

Function syntax

onEnter(action)

Parameters

  • action (function): Defines what actions must be taken when the context is activated.
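
Example

A minimal sketch, assuming a context defined with the context() function and activated with p.then() as in the then() section above; the onEnter() callback greets the user each time the context is entered:

Dialog script
let orderContext = context(() => {
    onEnter(p => {
        p.play('What would you like to order?');
    });

    intent('$(ITEM pizza|salad)', p => {
        p.play(`Adding ${p.ITEM.value} to your order`);
    });
});

intent('I want to place an order', p => {
    p.then(orderContext);
});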

Title

Runtime version: v3.x or later

title() is a special predefined function used to label a context. For details, see Labeling contexts.

Syntax

Function syntax

title(contextName)

Parameters

  • contextName (string): The name of the context that will be shown in logs.
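
Example

A minimal sketch, assuming a context defined with the context() function; the title labels the context in logs:

Dialog script
let orderContext = context(() => {
    title('Order flow');

    intent('Add $(ITEM pizza|salad) to my order', p => {
        p.play(`Adding ${p.ITEM.value}`);
    });
});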

resetContext

Runtime version: v3.x or later

resetContext() is a special predefined function used to deactivate all currently active contexts at once. For details, see Resetting contexts.

Syntax

Function syntax

resetContext()
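
Example

A minimal sketch, assuming resetContext() is called on the p object inside an intent, in line with the other session-specific methods:

Dialog script
intent('Cancel everything and start over', p => {
    p.play('OK, starting over');
    // Deactivate all currently active contexts
    p.resetContext();
});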

Session-specific objects and methods

Session-specific objects and methods are available via the predefined p object. These methods and objects persist during a user session, until the session is terminated. The user session is terminated after 30 minutes of inactivity or if the user quits the app.

userData

Runtime version: v3.x or later

p.userData is a runtime object that can be used to store any data. You can access it at any time from any script of your project regardless of the context. Note that the data stored in p.userData is available only during the given user session. For details, see userData.
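
Example

A minimal sketch: the value stored in p.userData persists for the rest of the user session and is available from any script in the project.

Dialog script
intent('My favorite color is $(COLOR red|green|blue)', p => {
    p.userData.favoriteColor = p.COLOR.value;
    p.play('Got it');
});

intent('What is my favorite color?', p => {
    p.play(p.userData.favoriteColor
        ? `It is ${p.userData.favoriteColor}`
        : `You have not told me yet`);
});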

authData

Runtime version: v3.x or later

p.authData is a runtime object that can be used to provide static device- or user-specific data, such as the user’s credentials, to Alan AI. If you need to receive dynamic data from the app, use the visual state instead.

For details, see authData.
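
Example

A minimal sketch; the login field is a hypothetical value that the client app is assumed to pass in authData:

Dialog script
intent('Who am I?', p => {
    // p.authData holds static data passed from the client app
    if (p.authData.login) {
        p.play(`You are signed in as ${p.authData.login}`);
    } else {
        p.play('I do not know who you are yet');
    }
});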

visual

Runtime version: v3.x or later

p.visual is a runtime object that can contain an arbitrary JSON object. It should be used to provide any dynamic data of the app state to Alan’s script or project. For details, see Visual state.
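
Example

A minimal sketch; the screen field is a hypothetical value that the client app is assumed to send with the visual state:

Dialog script
intent('What screen am I on?', p => {
    // p.visual holds the latest visual state sent by the client app
    p.play(`You are on the ${p.visual.screen || 'home'} screen`);
});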

Global objects and methods

project

Runtime version: v3.x or later

project is a global object that can be used to store any data you may need in separate dialog scripts of your project. When Alan AI builds the dialog model for your project, it loads scripts in the order defined in the scripts panel, from top to bottom. The project object will be available in any script that is below the script where it is defined.

Dialog script
// Script 1
project.config = {initValue: 1};

// Script 2
console.log(`Init value is ${project.config.initValue}`);

project API

Runtime version: v3.x or later

The project API can be used if you want to send data from the client app to the dialog script or perform some script logic without a voice or text command from the user. This can be done by setting up the logic for the project API and then calling it with the Alan AI SDK method callProjectApi(). For details, see Project API.

Syntax

Function syntax

projectAPI.functionName = function(p, data, callback) {}

Parameters

  • p (object): Predefined object containing the user session data and exposing Alan AI’s methods.

  • data (object): An object containing the data you want to pass to your script.

  • callback (function): A callback function used to send data back to the app.

Example

Dialog script
projectAPI.setToken = function(p, param, callback) {
    if (!param || !param.token) {
        callback("error: token is undefined");
    }
    p.userData.token = param.token;
    callback();
};

Generative AI

You can use the following functions and APIs to let your AI assistant query the data corpus and provide a response to the user:

corpus

Runtime version: v3.x or later

corpus is a predefined function that allows you to specify the websites and resources from which content is retrieved to conduct conversations between the AI assistant and users. For details, see Q&A virtual assistant.

Syntax

Function syntax

corpus(resource1 [, resource2, …, resourceN])

Parameters

  • resource (object or string): Defines one of the following:

      • The URL of a resource from which data is retrieved, the crawl depth and the maximum number of pages to index

      • A plain text string to be used for user queries

Example

Dialog script
corpus(
    {url: "https://alan.app/", depth: 2, maxPages: 50},
    {url: "https://alan.app/docs/", depth: 2, maxPages: 100},
);

corpus(`
    Alan AI is a complete Actionable AI platform to build, deploy and manage AI Assistants in a few days.
    With Alan AI, a conversational experience for your app can be built by a single developer, rather than a team of Machine Learning and DevOps experts.
`);

generator API

Runtime version: v4.1 or later

The generator API allows generating a structured response (a report, a product description and so on) based on the JSON-formatted data provided to it. For details, see Generator API.

Syntax

Function syntax

api.generator_v1({format, formatting});

generator.generate({data, query});

Parameters

  • format (string): The output format, for example: format: 'markdown'

  • formatting (string): A prompt on how to format the response, for example: formatting: 'highlight headers and bold key entities such as names, dates, addresses etc.'

  • data (object): An object with data used for response generation, for example: data: productInfo

  • query (string): A prompt on what to include in the response

Output

The generator API returns an output data stream with the response to the query.

Methods

The gather() method is used to collect (gather) the final response.

Example

In the example below, when the user says: Tell me about iPhone 13 Pro, include only its characteristics and image, Alan AI generates a structured response in the form of a product card containing the product characteristics and image provided in the productList array.

Dialog script
let productList = [
    {
        "name": "iPhone 13 Pro",
        "min_price":"$999",
        "vendor":"Apple",
        "spec":"A15 Bionic chip, Super Retina XDR display, ProMotion technology, Triple 12MP camera system, 5G capability, Face ID, Ceramic Shield front cover, and Water and dust resistance.",
        "image":"https://www.apple.com/newsroom/images/product/iphone/standard/Apple_iPhone-13-Pro_iPhone-13-Pro-Max_09142021_inline.jpg"
    },
    {
        "name": "Samsung Galaxy S21 Ultra",
        "min_price":"$1,199",
        "vendor":"Samsung",
        "spec":"Exynos 2100 or Snapdragon 888 processor, Dynamic AMOLED 2X display, Quad HD+ resolution, 108MP quad camera system, 5G capability, In-display fingerprint sensor, S Pen support, and Water and dust resistance.",
        "image":"https://image-us.samsung.com/us/smartphones/galaxy-s21/galaxy-s21-5g/v6/models/images/galaxy-s21-plus-5g_models_colors_phantom-violet.png"
    }
]

let generator = api.generator_v1({
    format: 'markdown',
    formatting: 'highlight headers and bold key entities such as names, dates, addresses etc.'
});

intent(
    `Tell me about $(MODEL iPhone 13 Pro|Samsung Galaxy S21 Ultra), (include|) $(R* .+)`,
    async p=> {
        let productInfo = productList.find(el => el.name === p.MODEL.value);
        let query = 'product details: ' + p.R.value;
        let stream = generator.generate({data: productInfo, query});

        p.play(`Retrieving product data...`);
        p.play(stream);
        let answer = await stream.gather();
        console.log(answer.answer);
    });

query API

Runtime version: v4.1 or later

The query API allows querying corpus data programmatically to retrieve a response that can be further used in the dialog flow. For details, see Query API.

Syntax

Function syntax

api.query_v1({corpus, query})

Parameters

  • corpus (text corpus): The corpus data to be queried, for example: corpus: FAQ

  • query (string): The query to be run against the corpus, for example: query: "How to reset a smartphone?"

Output

The query API returns a response object with the following data:

  • summary: Summary of the answer generated by the query API

  • answer: Detailed answer generated by the query API

Methods

The gather() method is used to collect (gather) the final response.

Example

In the example below, when the user asks a question: Can you tell me why my smartphone battery is draining quickly?, Alan AI queries the FAQ corpus and collects the final response with the gather() method. The final response is played to the user, the response summary and detailed answer are written to Alan Studio logs.

Dialog script
let FAQ = corpus(`
    Why is my smartphone battery draining quickly? Possible causes include excessive app usage, background processes, high screen brightness, push notifications, and weak cellular signals. To address this, close unnecessary apps, reduce screen brightness, disable push notifications for non-essential apps, and keep your device updated.
    How do I fix a frozen or unresponsive smartphone? Try a soft reset by holding the power button for 10 seconds. If that fails, perform a force restart by holding the power and volume down buttons for 10-15 seconds.
`);

intent("Can you tell me $(QUERY* .+)", async p => {
    p.play("Just a second...");
    let answer = await api.query_v1({corpus: FAQ, query: p.QUERY.value});
    if (answer) {
        p.play(answer);
        let final = await answer.gather();
        console.log(final.summary);
        console.log(final.answer);
    } else {
        p.play(`Sorry, I cannot help you with that`);
    }
});

Data transformation

You can use the following function for data preprocessing and transformation: transforms.

transforms

Runtime version: v4.2 or later

Transform functions, called through the transforms object, can be used to pre-process and format the input data into the required format using the defined template. For details, see Data transformation.

Syntax

Function syntax

transforms.transformName({input, query})

Parameters

  • transformName (string): The name of the transform created in the AI assistant project.

  • input (object): An object containing the data input passed to the transform.

  • query (string): The transform query, for example a description of the fields in the input data.

Example

In the example below, the text in the input is formatted by the rules specified in the format transform:

  • Common query: The input contains user data in JSON, the query contains descriptions of the fields, and the result contains the formatted text

  • Input: JSON, {"name": "Jerry Welsh", "age": "16", "address": "3456 Oak Street"}

  • Query: Name is the user's name, age is the user's age, address is the user's address

  • Result: Markdown

Results field
## User: user's name

- **Name**: user's name
- **Age**: user's age
- **Address**: user's address
Dialog script
intent(`Show user data`, async (p)=> {
    let u = await transforms.format({
        input: {"name": "John Smith", "age": "56", "address": "1234 Main Street"},
        query: "Name is the user's name, age is the user's age, address is the user's address"
    });
    p.play(u);
});

Speech recognition biasing

Recognition hints

Runtime version: v3.x or later

recognitionHints() allows you to provide a list of hints for the AI assistant to help it recognize specific terms, words or phrase patterns that are not used in everyday life but are expected in the user’s input. Use it to avoid frequent speech recognition errors.

Syntax

Function syntax

recognitionHints(recognitionHint1 [, recognitionHint2, …, recognitionHintN])

Parameters

  • recognitionHint (string): Comma-separated strings (hints), each representing a pattern of the user’s input.

Example

Dialog script
recognitionHints("Open ICIK");

intent('Open $(PARAM~ ICIK~ICIK)', p => {
    p.play(`Opening ${p.PARAM.label}`);
});

Homophone

Runtime version: v3.x or later

homophone() helps eliminate ambiguity and improve matching of intents containing terminology or domain-specific words. In the homophone function, you must specify a term expected in the user’s input and a list of its homophones — words that are pronounced in a similar way but are different in meaning. This will prevent the AI assistant from failing to match the intent when the user mispronounces or mistypes this term.

Note

To create a list of homophones for a term, you can check unrecognized phrases in Alan AI analytics.

Syntax

Function syntax

homophone(term, homophone1 [, homophone2, …, homophoneN] )

Parameters

  • term (string): The expected term to be matched.

  • homophone (string): A list of homophones for the term.

Example

Dialog script
homophone('flour', 'flower', 'floor');
homophone('cereal', 'serial');

intent('Add $(ITEM flour|cereal) to my shopping list', p => {
    p.play(`Sure, how much ${p.ITEM.value} would you like to add?`)
});

Predefined callbacks

After you create an AI assistant with Alan AI, the dialog goes through several states: the project is created, the user connects to the app and so on. When the dialog state changes, you may want to perform activities that are appropriate to this state. For example, when a new user connects to the dialog session, it may be necessary to set user-specific data.

To perform actions at different stages of the dialog lifecycle, you can use the following predefined callback functions:

onCreateProject

Runtime version: v3.x or later

onCreateProject is invoked when the dialog model for the dialog script is created. You can use this function to perform activities required immediately after the dialog model is built, for example, for any data initialization.

Syntax

Function syntax

onCreateProject(()=> {action})

Parameters

  • action (function): Defines what actions must be taken when the dialog model is created on the server in Alan AI Cloud.

Example

In the example below, onCreateProject is used to define values for project.drinks.

Dialog script
onCreateProject(() => {
    project.drinks = "green tea|black tea|oolong";
});

onCreateUser

Runtime version: v3.x or later

onCreateUser is invoked when a new user starts a dialog session. You can use this function to set user-specific data.

Syntax

Function syntax

onCreateUser(p => {action})

Parameters

  • p (object): Predefined object containing the user session data and exposing Alan AI’s methods.

  • action (function): Defines what actions must be taken when a new user starts a dialog session.

Example

In the example below, the onCreateUser function is used to assign the value to p.userData.favorites:

Dialog script
onCreateUser(p => {
    p.userData.favorites = "nature|landscape|cartoons";
});

onCleanupUser

Runtime version: v3.x or later

onCleanupUser is invoked when the user session ends. You can use this function for any cleanup activities.

Syntax

Function syntax

onCleanupUser(p => {action})

Parameters

  • p (object): Predefined object containing the user session data and exposing Alan AI’s methods.

  • action (function): Defines what actions must be taken when the user session ends.

Example

In the example below, the onCleanupUser function is used to reset p.userData.favorites value:

Dialog script
onCleanupUser(p => {
    p.userData.favorites = "";
});

onVisualState

Runtime version: v3.x or later

onVisualState is invoked when the visual state object is set. You can use this function, for example, to process data passed in the visual state.

Syntax

Function syntax

onVisualState((p, s) => {action})

Parameters

  • p (object): Predefined object containing the user session data and exposing Alan AI’s methods.

  • s (object): The JSON object passed with the visual state.

  • action (function): Defines what actions must be taken when the visual state is sent.

Example

In the example below, the onVisualState function is used to define values for dynamic slots created with the help of the visual state.

Setting the visual state in the app:

Client app
<script>
  function myFunction() {
    let values = ["pasta", "burger", "quesadilla"]
    alanBtnInstance.setVisualState({values});
  }
</script>

Retrieving the values and making the items list in the dialog script:

Dialog script
onVisualState((p, s) => {
    if (s.values) {
        p.visual.menu = { en: ""};
        p.visual.menu.en = s.values.join('|');
    }
});

intent('Give me a $(ITEM~ v:menu)', p => {
    p.play(`Adding one ${p.ITEM.value} to your order`);
});

onUserEvent

Runtime version: v3.x or later

onUserEvent is invoked when Alan AI emits a specific event driven by users’ interactions with the AI assistant. For the events list, see User events.

Syntax

Function syntax

onUserEvent((p, e) => {action})

Parameters

  • p (object): Predefined object containing the user session data and exposing Alan AI’s methods.

  • e (object): The event fired by Alan AI.

  • action (function): Defines what actions must be taken when the event is fired.

Example

In the example below, the AI assistant listens to the firstActivate event and, if the user activates the assistant for the first time, plays a greeting to the user.

Dialog script
onUserEvent((p, e)=> {
    console.log(e);
    if (e.event === "firstActivate") {
        p.play('Hi, this is Alan, your AI assistant! You can talk to me, ask questions and perform tasks with voice or text commands.');
    }
});

Debugging

console.log

Runtime version: v3.x or later

The console.log() function is used to write all sorts of informational messages to Alan AI Studio logs. With it, you can check objects, variables and slot values, capture events and so on.

Syntax

Function syntax

console.log(message)

Parameters

  • message (string): The message to be logged.

Example

Dialog script
intent('Save this location as my $(L home|office|work) (address|)', p => {
    console.log(p.L.value);
    p.play('The location is saved');
})

console.error

Runtime version: v3.x or later

The console.error() function is used to write error messages to Alan AI Studio logs. With it, you can troubleshoot and debug your dialog script.

Syntax

Function syntax

console.error(message)

Parameters

  • message (string): The message to be logged.

Example

Dialog script
try {
    // your code
}
catch (e) {
    console.error(e)
}